

Search for: All records, Creators/Authors contains: "Bevilacqua, Andrea"


  1.
    Statistical emulators are a key tool for rapidly producing probabilistic hazard analyses of geophysical processes. Given output data computed for a relatively small number of parameter inputs, an emulator interpolates the data, providing the expected value of the output at untried inputs and an estimate of the error at that point. In this work, we propose to fit Gaussian Process emulators to the output from a volcanic ash transport model, Ash3d. Our goal is to use the emulator to predict the volcanic ash thickness simulated by Ash3d at a location of interest. Our approach is motivated by two challenges in fitting emulators: characterizing the input wind field, and capturing interactions between that wind field and variable grain sizes. We resolve these challenges by using physical knowledge of tephra dispersal. We propose new physically motivated variables as inputs and use normalized output as the response for fitting the emulator. Subsetting based on the initial conditions is also critical in our emulator construction. Simulation studies characterize the accuracy and efficiency of our emulator construction and also reveal its current limitations. Our work represents the first emulator construction for volcanic ash transport models that accounts for the simulated physical process.
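The core idea of a Gaussian Process emulator — interpolate expensive simulator output and report a pointwise error estimate — can be sketched as follows. This is a minimal illustration with a toy stand-in function, not the authors' Ash3d pipeline; the simulator, design size, and kernel choice here are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Toy "simulator": a cheap stand-in for an expensive ash-transport run.
def simulator(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# Small design of input points (in the paper these would be physically
# motivated variables, e.g. derived from the wind field).
X_train = rng.uniform(0, 1, size=(30, 2))
y_train = simulator(X_train)

# Fit a Gaussian Process emulator to the simulator output.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=0.2),
    normalize_y=True,
).fit(X_train, y_train)

# The emulator interpolates: expected value plus an error estimate
# at untried inputs, at a tiny fraction of a simulator run's cost.
X_new = rng.uniform(0, 1, size=(5, 2))
mean, std = gp.predict(X_new, return_std=True)
```

At the design points themselves a noise-free GP reproduces the training data almost exactly, which is the interpolation property the abstract relies on.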
  2. Abstract

    Stromboli volcano (Italy), persistently active with low-energy explosive activity, is a very attractive place for visitors, scientists, and inhabitants of the island. Nevertheless, occasional more intense eruptions can present a serious danger. This study focuses on the modeling and estimation of their inter-event time and temporal rate. To this end, we constructed a new historical catalog of major explosions and paroxysms through a detailed review of the scientific literature of the last ca. 140 years. The catalog includes the calendar date and phenomena descriptions for 180 explosive events, of which 36 were paroxysms. We evaluated the impact of the main sources of uncertainty affecting the historical catalog. In particular, we categorized as uncertain 45 major explosions that reportedly occurred before 1985 and tested the effect of excluding these events from our analysis. Moreover, after analyzing the entire record in the period [1879, 2020], we separately considered, as sequences, the events in [1879, 1960] and in [1985, 2020] because of possible under-recording issues in the period [1960, 1985]. Our new models quantify the temporal rate of major explosions and paroxysms as a function of the time elapsed since the last event. Recurrence hazard levels are found to be significantly elevated in the weeks and months following a major explosion or paroxysm, and then gradually decrease over longer periods. The computed hazard functions are also used to illustrate a methodology for estimating the order-of-magnitude individual risk of fatality under certain baseline conditions. This study represents a first formal quantitative advance in determining long-term hazard levels at Stromboli.
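A hazard rate that is highest just after an event and decays with elapsed time is characteristic of a renewal process with a decreasing hazard function. The sketch below illustrates that generic idea with a Weibull renewal model fitted to synthetic inter-event times; it is not the authors' model, and the simulated gaps are not the Stromboli catalog.

```python
import numpy as np
from scipy import stats

# Hypothetical inter-event times in days (synthetic, NOT the Stromboli catalog).
rng = np.random.default_rng(1)
gaps = stats.weibull_min.rvs(0.7, scale=200.0, size=180, random_state=rng)

# Fit a Weibull renewal model to the inter-event times (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(gaps, floc=0)

# Hazard rate h(t) = f(t) / S(t): the event rate as a function of the
# time elapsed since the last event.
def hazard(t):
    return (stats.weibull_min.pdf(t, shape, loc=0, scale=scale)
            / stats.weibull_min.sf(t, shape, loc=0, scale=scale))
```

With a fitted shape parameter below 1 the hazard is elevated shortly after an event and decreases over longer waiting times, matching the qualitative behavior described in the abstract.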

  3.
  4.
    Abstract This study presents a new method, called the Radial Interpolation Method, to interpolate data characterized by an approximately radial pattern around a relatively constrained central zone, such as the ground deformation patterns shown in many active volcanic areas. The method enables the fast production of short-term deformation maps on the basis of spatially sparse ground deformation measurements and can provide uncertainty quantification on the interpolated values, which is fundamental for hazard assessment purposes and deformation source reconstruction. The presented approach does not depend on a priori assumptions about the geometry, location and physical properties of the source, except for the requirement of a locally radial pattern, i.e., allowing multiple centers of symmetry. We test the new method on a synthetic point source example, and then we apply the method to selected time intervals of real geodetic data collected at the Campi Flegrei caldera during the last 39 years, including examples of leveling, Geodetic Precise Traversing measurements and Global Positioning System. The maps of horizontal displacement, calculated inland, show maximum values lying along a semicircular annular region with a radius of about 2–3 km. This semi-annular area is marked by mesoscale structures such as faults, sand dikes and fractures. The maps of vertical displacement describe a linear relation between the maximum vertical uplift measured and the volume variation ΔV. The multiplicative factor in the linear relation is about 0.3 × 10⁶ m³/cm if we estimate the proportion of ΔV captured by the on-land GPS network and use this to estimate the full ΔV. In this case, the 95% confidence interval on this factor K, from the linear regression, is ±5%. Finally, we briefly discuss how the new method could be used for the production of short-term vent opening maps on the basis of real-time geodetic measurements of the horizontal and vertical displacements.
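The simplest special case of a radial interpolation — a single center of symmetry, so that the problem collapses to one dimension in radial distance — can be sketched as below. This is a drastic simplification for illustration only: the paper's Radial Interpolation Method allows multiple centers of symmetry and provides uncertainty quantification, neither of which is attempted here, and all data are synthetic.

```python
import numpy as np

# Synthetic sparse measurements of vertical displacement (cm) at
# benchmark positions (km): a point-source-like radial pattern plus noise.
rng = np.random.default_rng(2)
center = np.array([0.0, 0.0])          # assumed single center of symmetry
pts = rng.uniform(-5, 5, size=(40, 2))
r_obs = np.linalg.norm(pts - center, axis=1)
uplift = 10.0 / (1.0 + r_obs**2) + rng.normal(0, 0.2, size=40)

# Under a locally radial assumption, interpolation reduces to one
# dimension: sort the data by radial distance and interpolate
# displacement as a function of radius alone.
order = np.argsort(r_obs)
def interp_radial(r):
    return np.interp(r, r_obs[order], uplift[order])

# The predicted displacement anywhere on the map follows from its radius.
grid = np.array([[1.0, 1.0], [3.0, -2.0]])
pred = interp_radial(np.linalg.norm(grid - center, axis=1))
```

Because the synthetic uplift decays with distance from the center, the interpolated value at the nearer grid point exceeds the one at the farther point.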
  5. Abstract. We detail a new prediction-oriented procedure aimed at volcanic hazard assessment based on geophysical mass flow models constrained with heterogeneous and poorly defined data. Our method relies on an itemized application of the empirical falsification principle over an arbitrarily wide envelope of possible input conditions. We thus provide a first step towards an objective and partially automated experimental design construction. In particular, instead of fully calibrating model inputs on past observations, we create and explore more general requirements of consistency, and then we separately use each piece of empirical data to remove those input values that are not compatible with it. Hence, partial solutions are defined to the inverse problem. This has several advantages compared to a traditionally posed inverse problem: (i) the potentially nonempty inverse images of partial solutions of multiple possible forward models characterize the solutions to the inverse problem; (ii) the partial solutions can provide hazard estimates under weaker constraints, potentially including extreme cases that are important for hazard analysis; (iii) if multiple models are applicable, specific performance scores against each piece of empirical information can be calculated. We apply our procedure to the case study of the Atenquique volcaniclastic debris flow, which occurred on the flanks of Nevado de Colima volcano (Mexico) in 1955. We adopt and compare three depth-averaged models currently implemented in the TITAN2D solver, available from https://vhub.org (Version 4.0.0 – last access: 23 June 2016). The associated inverse problem is not well posed if approached in a traditional way. We show that our procedure can extract valuable information for hazard assessment, allowing the exploration of the impact of synthetic flows that are similar to those that occurred in the past but different in plausible ways. The implementation of multiple models is thus a crucial aspect of our approach, as they can allow the covering of other plausible flows. We also observe that model selection is inherently linked to the inversion problem.
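The falsification idea — sample a wide envelope of inputs, then use each piece of empirical data separately to reject the inputs incompatible with it — can be sketched generically as follows. The forward model, parameter ranges, and observation brackets here are entirely made up for illustration; they are not the TITAN2D models or the Atenquique data.

```python
import numpy as np

# Toy forward model standing in for an expensive mass-flow simulation:
# maps inputs (basal friction angle in degrees, volume in arbitrary units)
# to two observables, e.g. runout distance and inundated area. Hypothetical.
def forward(phi, volume):
    runout = 50.0 * volume**0.4 / np.tan(np.radians(phi))
    area = 12.0 * volume**0.6
    return runout, area

rng = np.random.default_rng(3)
# An arbitrarily wide envelope of possible inputs, sampled without
# calibrating on past observations.
phis = rng.uniform(5, 35, size=2000)
vols = rng.uniform(0.1, 5.0, size=2000)
runout, area = forward(phis, vols)

# Each piece of empirical data falsifies, on its own, the inputs that are
# not compatible with it (the brackets below are invented for this sketch).
ok_runout = (runout > 80) & (runout < 400)   # consistent with observed runout
ok_area = (area > 10) & (area < 40)          # consistent with observed area

# Partial solutions: the survivors of each constraint taken separately;
# their intersection is the input set consistent with all the data jointly.
survivors = ok_runout & ok_area
```

Per-constraint survivor counts also give the kind of per-datum performance score mentioned in point (iii): a model whose envelope is emptied by one observation is falsified by that observation alone.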

  6. Abstract

    Ideally, probabilistic hazard assessments combine available knowledge about the physical mechanisms of the hazard, data on past hazards, and any precursor information. Systematically assessing the probability of rare yet catastrophic hazards adds a layer of difficulty due to limited observational data. Via computer models, one can exercise potentially dangerous scenarios that may not have happened in the past but are probabilistically consistent with the aleatoric nature of previous volcanic behavior in the record. Traditional Monte Carlo-based methods to calculate such hazard probabilities suffer from two issues: they are computationally expensive, and they are static. In light of new information, newly available data, signs of unrest, or a new probabilistic analysis describing uncertainty about scenarios, the Monte Carlo calculation would need to be redone under the same computational constraints. Here we present an alternative approach utilizing statistical emulators that provides an efficient way to overcome the computational bottleneck of typical Monte Carlo approaches. Moreover, this approach is independent of any aleatoric scenario model and yet can be applied rapidly to any scenario model, making it dynamic. We present and apply this emulator-based approach to create multiple probabilistic hazard maps for inundation of pyroclastic density currents in the Long Valley Volcanic Region. Further, we illustrate how this approach enables an exploration of the impact of epistemic uncertainties on these probabilistic hazard forecasts. In particular, we focus on the uncertainty of vent opening models and how that uncertainty, both aleatoric and epistemic, impacts the resulting probabilistic hazard maps of pyroclastic density current inundation.
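The "dynamic" property claimed in the abstract comes from separating the expensive simulation runs (done once, on a fixed design) from the scenario model (sampled cheaply through the emulator, so it can be swapped at will). A minimal sketch of that separation, with a toy simulator and invented vent-opening scenario models rather than the authors' pyroclastic flow setup:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

# Toy stand-in for an expensive flow simulator: inundation depth at one map
# location as a function of (vent distance, mobilized volume). Hypothetical.
def simulator(X):
    return np.maximum(0.0, X[:, 1] - 0.4 * X[:, 0])

# Expensive step, done once: run the simulator on a fixed design and fit
# the emulator. No scenario model is involved here.
rng = np.random.default_rng(4)
X_design = rng.uniform([0, 0], [10, 6], size=(60, 2))
gp = GaussianProcessRegressor(normalize_y=True).fit(X_design, simulator(X_design))

# Cheap step, repeatable: probability that inundation exceeds a threshold
# under a given scenario model, evaluated through the emulator only. When
# the (aleatoric) scenario model changes, only this step is redone.
def exceedance_prob(sample_scenario, threshold=1.0, n=5000):
    X = sample_scenario(n)
    return float(np.mean(gp.predict(X) > threshold))

# Two alternative vent-opening scenario models (invented for illustration).
near_vents = lambda n: rng.uniform([0, 0], [4, 6], size=(n, 2))
far_vents = lambda n: rng.uniform([6, 0], [10, 6], size=(n, 2))

p_near = exceedance_prob(near_vents)
p_far = exceedance_prob(far_vents)
```

Repeating `exceedance_prob` over many plausible scenario models is one way to expose the epistemic spread in the resulting hazard maps, in the spirit of the vent-opening uncertainty exploration the abstract describes.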
